Bayesian Learning in Reproducing Kernel Hilbert Spaces

Authors

  • Ralf Herbrich
  • Thore Graepel
Abstract

Support Vector Machines find the hypothesis that corresponds to the centre of the largest hypersphere that can be placed inside version space, i.e. the space of all consistent hypotheses given a training set. The boundaries of version space touched by this hypersphere define the support vectors. An even more promising approach is to construct the hypothesis using the whole of version space. This is achieved by the Bayes point: the midpoint of the region of intersection of all hyperplanes bisecting version space into two volumes of equal magnitude. It is known that the centre of mass of version space approximates the Bayes point [30]. The centre of mass is estimated by averaging over the trajectory of a billiard in version space. We derive bounds on the generalisation error of Bayesian classifiers in terms of the volume ratio of version space and parameter space. This ratio serves as an effective VC dimension and greatly influences generalisation. We present experimental results indicating that Bayes Point Machines consistently outperform Support Vector Machines. Moreover, we show theoretically and experimentally how Bayes Point Machines can easily be extended to admit training errors.
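As a rough illustration of the centre-of-mass idea (not the billiard algorithm the paper actually uses), one can rejection-sample unit-norm weight vectors that classify the whole training set correctly and average them; the data and all names below are invented for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data: labels come from a hidden weight vector.
w_true = np.array([1.0, -0.5])
X = rng.normal(size=(20, 2))
y = np.sign(X @ w_true)

# Rejection-sample the version space: unit-norm weight vectors that
# classify every training example correctly.
consistent = []
while len(consistent) < 100:
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    if np.all(np.sign(X @ w) == y):
        consistent.append(w)

# The centre of mass of the sampled version space approximates the Bayes point.
w_bp = np.mean(consistent, axis=0)
w_bp /= np.linalg.norm(w_bp)

# Version space is an intersection of half-spaces, so the average of
# consistent hypotheses is itself consistent.
print(np.all(np.sign(X @ w_bp) == y))  # True
```

Rejection sampling only works in toy dimensions; the billiard method of the paper exists precisely because version space becomes vanishingly small in realistic feature spaces.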


Similar resources

Some Properties of Reproducing Kernel Banach and Hilbert Spaces

This paper is devoted to the study of reproducing kernel Hilbert spaces. We focus on multipliers of reproducing kernel Banach and Hilbert spaces. In particular, we try to extend this concept and prove some related theorems. Moreover, we focus on reproducing kernels in vector-valued reproducing kernel Hilbert spaces. In particular, we extend reproducing kernels to relative reproducing kernels an...

Fisher’s Linear Discriminant Analysis for Weather Data by reproducing kernel Hilbert spaces framework

Recently, with the development of science and technology, data of a functional nature have become easy to collect. Hence, statistical analysis of such data is of great importance. As in multivariate analysis, linear combinations of random variables play a key role in functional analysis. The theory of Reproducing Kernel Hilbert Spaces is very important in this context. In this paper we study a gen...

Solving multi-order fractional differential equations by reproducing kernel Hilbert space method

In this paper we propose a relatively new semi-analytical technique to approximate the solution of nonlinear multi-order fractional differential equations (FDEs). We present some results concerning the uniqueness of solutions of nonlinear multi-order FDEs and discuss the existence of solutions for nonlinear multi-order FDEs in reproducing kernel Hilbert space (RKHS). We further give an error a...

Reproducing kernel Hilbert spaces of Gaussian priors

We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described through a concentration function that is expressed in the reproducing Hilbert space. Absolute co...

Reproducing Kernel Hilbert Spaces in Learning Theory: the Sphere and the Hypercube

We analyze the regularized least square algorithm in learning theory with Reproducing Kernel Hilbert Spaces (RKHS). Explicit convergence rates for the regression and binary classification problems are obtained in particular for the polynomial and Gaussian kernels on the n-dimensional sphere and the hypercube. There are two major ingredients in our approach: (i) a law of large numbers for Hilber...

From Zero to Reproducing Kernel Hilbert Spaces in Twelve Pages or Less

Reproducing Kernel Hilbert Spaces (RKHS) have been found incredibly useful in the machine learning community. Their theory has been around for quite some time and has been used in the statistics literature for at least twenty years. More recently, their application to perceptron-style algorithms, as well as new classes of learning algorithms (especially large-margin or other regularization machi...
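The perceptron-style use of an RKHS mentioned above can be sketched with a kernel perceptron: the hypothesis lives in the RKHS but is stored implicitly as a kernel expansion over the examples on which the algorithm made a mistake. The data and parameters below are made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two classes separated by a circle, with a band around the boundary
# removed so the problem is separable with a margin in feature space.
P = rng.uniform(-1, 1, size=(300, 2))
r2 = np.sum(P**2, axis=1)
mask = np.abs(r2 - 0.5) > 0.1
X, y = P[mask], np.where(r2[mask] < 0.5, 1, -1)

def rbf(a, b, gamma=5.0):
    # Gaussian (RBF) kernel matrix between the rows of a and the rows of b.
    d = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-gamma * d)

# Kernel perceptron: f(x) = sum_i alpha_i y_i k(x_i, x); on each
# mistake, increment the offending example's coefficient.
K = rbf(X, X)
alpha = np.zeros(len(X))
for _ in range(100):            # epochs
    mistakes = 0
    for i in range(len(X)):
        if y[i] * ((alpha * y) @ K[:, i]) <= 0:
            alpha[i] += 1.0
            mistakes += 1
    if mistakes == 0:           # converged: all training points correct
        break

train_acc = np.mean(np.sign((alpha * y) @ K) == y)
```

No point is ever mapped to an explicit feature vector; every operation goes through the kernel matrix, which is exactly the "kernel trick" that makes RKHS methods practical.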



Publication date: 1999